We will include the NLLs in the final version of the paper and, by running more trials, will report averages and standard deviations in all of our other 3 tables.
Thank you for your comments and suggestions. We agree with all three reviewers that evaluating the predictive variances is important. We were unaware of Nguyen's paper at submission and will add this discussion to the paper. We note that the precomputation, like CG, can be run to a specified desired tolerance, and that Hensman et al. (2013) used 1000 inducing points on the massive Airline dataset. Finally, we will clarify that SGPR is by (Titsias, 2009) and SVGP is by (Hensman et al., 2013).
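For context, running an iterative linear solve "to a specified desired tolerance" is the standard stopping rule in conjugate gradients. A minimal sketch of that idea (the function name, the toy regularised system, and all parameters are our own illustration, not from the paper):

```python
import numpy as np

def cg_solve(A, b, tol=1e-8, max_iter=1000):
    """Conjugate gradients for a symmetric positive-definite A,
    iterated until the residual norm falls below the requested tolerance."""
    x = np.zeros_like(b)
    r = b - A @ x          # initial residual
    p = r.copy()           # initial search direction
    rs = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:   # stop at the specified tolerance
            break
        p = r + (rs_new / rs) * p   # update the conjugate direction
        rs = rs_new
    return x

# Toy SPD system, standing in for a regularised kernel matrix K + sigma^2 I
rng = np.random.default_rng(0)
M = rng.standard_normal((50, 50))
A = M @ M.T + 50.0 * np.eye(50)
b = rng.standard_normal(50)
x = cg_solve(A, b, tol=1e-8)
print(np.linalg.norm(A @ x - b))  # residual is below the requested tolerance
```

Tightening `tol` trades extra matrix-vector products for a more accurate solve, which is the knob being referred to above.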
Review for NeurIPS paper: Kernel Methods Through the Roof: Handling Billions of Points Efficiently
There is a consensus among the knowledgeable reviewers that this work makes a significant contribution to the kernel community. It integrates several practical techniques and engineering efforts to further improve the scalability of kernel machines. The techniques proposed in this work will permit the use of several GPUs in training kernel-based models with huge amounts of data, which I also see as a significant contribution. Regardless of the overall score, I think this paper deserves an oral because it shows how to take full advantage of GPU hardware when solving learning problems with kernel methods. Scalability is one of the long-standing problems in kernel machines but has been largely neglected and under-appreciated in the past few years.
AI-engineered enzyme eats entire plastic containers
A plastic-degrading enzyme enhanced by amino acid changes designed by a machine-learning algorithm can depolymerise polyethylene terephthalate (PET) at least twice as fast as, and at lower temperatures than, the next-best engineered enzyme. Six years ago, scientists sifting through debris at a plastic-bottle recycling plant discovered a bacterium that can degrade PET. The organism has two enzymes that hydrolyse the polymer first into mono-(2-hydroxyethyl) terephthalate and then into ethylene glycol and terephthalic acid to use as an energy source. One enzyme in particular, PETase, has become the target of protein engineering efforts to make it stable at higher temperatures and to boost its catalytic activity. A team led by Hal Alper from the University of Texas at Austin in the US has created a PETase that can degrade 51 different PET products, including whole plastic containers and bottles.
Architectural design of AI software: the 3 layers
This post is the first in a series that will highlight the similarities and differences between AI software development and non-AI software development. In this article, we will focus on the software architecture of a complete AI solution. Developing Artificial Intelligence (AI) software components using techniques such as Deep Learning (DL) or Machine Learning (ML) implies some changes in the way you produce a software solution. In "traditional" software development (hereafter, non-AI software), software engineers write source code in a programming language (Python, Java, C, etc.) to implement an algorithm. By contrast, AI software development does not involve that much coding.
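To make the contrast concrete, here is a minimal sketch (entirely our own illustration, not from the post): a non-AI component where the engineer hand-writes the decision rule, next to an AI-style component where the rule's parameter is learned from labelled examples instead of being coded by hand.

```python
# Non-AI software: the engineer writes the algorithm explicitly.
def is_spam_rule(message: str) -> bool:
    # Hand-coded rule chosen by the developer
    return "free money" in message.lower()

# AI-style software: the behaviour is learned from data.
def fit_threshold(examples):
    """Learn an exclamation-count threshold that best separates the labels.

    examples: list of (exclamation_count, is_spam) pairs.
    """
    best_t, best_acc = 0, 0.0
    for t in range(10):
        acc = sum((count > t) == label for count, label in examples) / len(examples)
        if acc > best_acc:
            best_t, best_acc = t, acc
    return best_t

data = [(0, False), (1, False), (4, True), (6, True), (2, False), (5, True)]
t = fit_threshold(data)
print(t)  # prints 2: the learned parameter replaces hand-written logic
```

In the first function the logic lives in the source code; in the second, the source code only describes *how to learn*, and the actual behaviour comes from the training data, which is the shift in workflow the post is describing.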
How Can Businesses Best Leverage Data Analytics?
What's the best way for businesses to leverage AI and data analytics in this software-driven market? With all the hand-waving around "AI," it's easy to get buzzword exhaustion. But it's really important that we don't lose sight of the reality: machine learning will fundamentally change the way we illuminate and understand engineering performance. Consider how things like sabermetrics have revolutionized the way GMs, coaches and players understand and unlock high performance[1]. Now multiply that impact by some big number, given how essential software engineering is to the modern world, and how much sheer activity data (issues worked, code committed, scans run, deployments made, etc.) is produced in the course of building software.